37 research outputs found
High Dimensional Consistent Digital Segments
We consider the problem of digitalizing Euclidean line segments from R^d to Z^d. Christ et al. (DCG, 2012) showed how to construct a set of consistent digital segments (CDS) for d=2: a collection of segments connecting any two points in Z^2 that satisfies the natural extension of the Euclidean axioms to Z^d. In this paper we study the construction of CDSs in higher dimensions.
We show that any total order can be used to create a set of consistent digital rays (CDR) in Z^d (a set of rays emanating from a fixed point p that satisfies the extension of the Euclidean axioms). We fully characterize for which total orders the construction holds and study their Hausdorff distance, which in particular positively answers the question posed by Christ et al.
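To make the error measure used in this line of work concrete, here is a small Python sketch (our own illustration, not the construction from the paper): it digitizes a single segment into a monotone lattice path by a greedy rule and reports its one-sided Hausdorff distance to the Euclidean segment. The function names and the greedy rule are assumptions for illustration only; a per-segment greedy rounding like this does not in general form a consistent system of digital segments.

```python
import math

def digital_segment(px, py):
    """Toy digitization of the segment from (0,0) to (px,py), px,py >= 0:
    a monotone lattice path that greedily stays closest to the Euclidean
    segment. (Illustration only; greedy per-segment rounding like this
    does not yield a consistent digital segment system.)"""
    path = [(0, 0)]
    x = y = 0
    while (x, y) != (px, py):
        cands = []
        if x < px:
            cands.append((x + 1, y))   # step right
        if y < py:
            cands.append((x, y + 1))   # step up
        # choose the step minimizing |py*X - px*Y|, i.e. the residual
        # of the line equation through (0,0) and (px,py)
        x, y = min(cands, key=lambda c: abs(py * c[0] - px * c[1]))
        path.append((x, y))
    return path

def error_to_segment(path, px, py):
    """One-sided Hausdorff distance from the lattice path to the
    Euclidean segment (0,0)-(px,py): maximum over path points of the
    point-to-segment distance."""
    def dist(x, y):
        # project (x,y) onto the segment, clamped to [0,1]
        t = max(0.0, min(1.0, (x * px + y * py) / (px * px + py * py)))
        return math.hypot(x - t * px, y - t * py)
    return max(dist(x, y) for x, y in path)
```

For a single segment this greedy path has error below 1; the difficulty studied in the paper is achieving small error for all segments simultaneously while keeping the family consistent.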
Computational Complexity of the α-Ham-Sandwich Problem
Given d well-separated convex point sets in R^d and prescribed fractions α_1, …, α_d, a classical continuous result guarantees a unique oriented hyperplane that cuts off a respective fraction α_i from each set. Steiger and Zhao [DCG 2010] proved a discrete analogue of this theorem, which we call the α-Ham-Sandwich theorem. They gave an algorithm to find the hyperplane in time O(n (log n)^{d-3}), where n is the total number of input points. The computational complexity of this search problem in high dimensions is open, quite unlike the complexity of the Ham-Sandwich problem, which is now known to be PPA-complete (Filos-Ratsikas and Goldberg [STOC 2019]).
Recently, Fearnley, Gordon, Mehta, and Savani [ICALP 2019] introduced a new sub-class of CLS (Continuous Local Search) called Unique End-of-Potential Line (UEOPL). This class captures problems in CLS that have unique solutions. We show that for the α-Ham-Sandwich theorem, the search problem of finding the dividing hyperplane lies in UEOPL. This gives the first non-trivial containment of the problem in a complexity class and places it in the company of classic search problems such as finding the fixed point of a contraction map, the unique sink orientation problem and the P-matrix linear complementarity problem.
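For intuition about what the α-Ham-Sandwich theorem asserts, the d = 1 case degenerates to an order-statistic computation: a single cut point realizes the prescribed fraction, and it is found by selection. The sketch below is a hypothetical helper of our own (not from the paper); the open complexity questions concern d ≥ 2, where the sets must be convex and well-separated.

```python
import math

def alpha_cut_1d(points, alpha):
    """d = 1 toy case of the alpha-Ham-Sandwich problem: return the
    point at which a cut realizes a ceil(alpha * n) fraction of the
    set. Illustrative sketch only; for d >= 2 the problem asks for a
    hyperplane cutting prescribed fractions from d sets at once."""
    pts = sorted(points)              # O(n log n); selection gives O(n)
    k = math.ceil(alpha * len(pts))
    # the k-th smallest point is the last one cut off by the hyperplane
    return pts[k - 1]
```

In one dimension the answer is trivially unique and easy to find; the UEOPL containment shown in the paper formalizes an analogous "unique solution along a potential line" structure in higher dimensions.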
A Generalization of Self-Improving Algorithms
Ailon et al. [SICOMP'11] proposed self-improving algorithms for sorting and Delaunay triangulation (DT) when the input instances x_1, …, x_n follow some unknown product distribution. That is, each x_i comes from a fixed unknown distribution D_i, and the x_i's are drawn independently. After spending O(n^{1+ε}) time in a learning phase, the subsequent expected running time is O((n + H)/ε), where H ∈ {H_S, H_DT}, and H_S and H_DT are the entropies of the distributions of the sorting and DT output, respectively. In this paper, we allow dependence among the x_i's under the group product distribution. There is a hidden partition of [1, n] into groups; the x_i's in the k-th group are fixed unknown functions of the same hidden variable u_k; and the u_k's are drawn from an unknown product distribution. We describe self-improving algorithms for sorting and DT under this model when the functions that map the u_k's to the x_i's are well-behaved. After an O(poly(n))-time training phase, we achieve O(n + H_S) and O(nα(n) + H_DT) expected running times for sorting and DT, respectively, where α(·) is the inverse Ackermann function.
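The self-improving paradigm can be sketched in a few lines: a training phase learns statistics of the input distribution, and the operation phase exploits them. The toy Python class below is our own simplified illustration in the spirit of Ailon et al., not their algorithm: it learns global bucket boundaries (empirical quantiles of the merged training inputs) and then sorts by bucketing. The real sorter additionally builds a near-optimal search structure per input position i, tuned to D_i, which is what yields the entropy-sensitive bound; that refinement is omitted here.

```python
import bisect

class SelfImprovingSorter:
    """Toy sketch of a self-improving sorter (simplified assumption of
    ours, not the algorithm from the paper): train() learns bucket
    boundaries from sample instances; sort() buckets each value by
    binary search and sorts within buckets."""

    def __init__(self, n_buckets):
        self.n_buckets = n_buckets
        self.boundaries = []

    def train(self, samples):
        """samples: training instances (lists of comparable values).
        Learns n_buckets-1 empirical quantiles of the merged inputs."""
        vals = sorted(v for s in samples for v in s)
        self.boundaries = [vals[(j * len(vals)) // self.n_buckets]
                           for j in range(1, self.n_buckets)]

    def sort(self, x):
        buckets = [[] for _ in range(self.n_buckets)]
        for v in x:
            # binary search over the learned boundaries places v
            buckets[bisect.bisect_left(self.boundaries, v)].append(v)
        out = []
        for b in buckets:
            b.sort()   # cheap when the learned buckets stay small
            out.extend(b)
        return out
```

Because the bucket boundaries are globally sorted, concatenating the sorted buckets yields a sorted sequence; the speedup comes from buckets being small in expectation once the distribution has been learned.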
Distance Bounds for High Dimensional Consistent Digital Rays and 2-D Partially-Consistent Digital Rays
We consider the problem of digitalizing Euclidean segments. Specifically, we look for a constructive method to connect any two points in Z^d. The construction must be consistent (that is, satisfy the natural extension of the Euclidean axioms) while resembling the Euclidean segments as much as possible. Previous work has shown asymptotically tight results in two dimensions with Θ(log N) error, where resemblance between segments is measured with the Hausdorff distance, and N is the L1 distance between the two points. This construction was considered tight because of an Ω(log N) lower bound that applies to any consistent construction in Z^2. In this paper we observe that the lower bound does not directly extend to higher dimensions. We give an alternative argument showing that any consistent construction in d dimensions must have Ω(log^{1/(d−1)} N) error. We tie the error of a consistent construction in high dimensions to the error of similar weak constructions in two dimensions (constructions for which some points need not satisfy all the axioms). This not only opens the possibility of having constructions with o(log N) error in high dimensions, but also opens up an interesting line of research on the tradeoff between the number of axiom violations and the error of the construction. A side result, which we find of independent interest, is the introduction of the bichromatic discrepancy: a natural extension of the concept of discrepancy of a set of points. In this paper, we define this concept and extend known results to the chromatic setting.
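To convey the flavor of colored discrepancy, here is a standard one-dimensional red/blue (combinatorial) discrepancy computation over contiguous intervals, written as a short Python sketch. This is only an illustration of the general concept; the bichromatic discrepancy defined in the paper is a notion on point sets and may differ in its details.

```python
def interval_discrepancy(colors):
    """Red/blue combinatorial discrepancy over all contiguous intervals
    of a colored sequence: max over intervals of |#red - #blue|.
    colors: sequence of +1 (red) / -1 (blue).
    Illustration of the classical notion only, not the paper's
    definition of bichromatic discrepancy."""
    best = 0
    prefix = 0
    min_pref = max_pref = 0
    for c in colors:
        prefix += c
        # the sum over interval (i, j] equals prefix[j] - prefix[i],
        # so extreme prefix values give the extreme interval imbalance
        best = max(best, prefix - min_pref, max_pref - prefix)
        min_pref = min(min_pref, prefix)
        max_pref = max(max_pref, prefix)
    return best
```

The prefix-sum trick evaluates all Θ(n^2) intervals implicitly in O(n) time, since only the running minimum and maximum of the prefix sums matter.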
Rectilinear Link Diameter and Radius in a Rectilinear Polygonal Domain
We study the computation of the diameter and radius under the rectilinear link distance within a rectilinear polygonal domain of n vertices and h holes. We introduce a graph of oriented distances to encode the distance between pairs of points of the domain. This helps us transform the problem so that we can search through the candidates more efficiently. Our algorithm computes both the diameter and the radius in min{O(n^ω), O(n^2 + nh log h + χ^2)} time, where ω < 2.373 denotes the matrix multiplication exponent and χ ∈ Ω(n) ∩ O(n^2) is the number of edges of the graph of oriented distances. We also provide a faster algorithm for computing the diameter that runs in O(n^2 log n) time.
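Once distances are encoded in a graph, diameter and radius are the maximum and minimum vertex eccentricities. The baseline below computes both by BFS from every vertex of an unweighted graph; it is only the generic O(n(n + m)) brute force, which the paper's algorithm beats by exploiting the structure of its graph of oriented distances (and fast matrix multiplication).

```python
from collections import deque

def diameter_and_radius(adj):
    """Diameter and radius of a connected unweighted graph via BFS from
    every vertex. Brute-force baseline only; adj maps each vertex to an
    iterable of its neighbors."""
    ecc = {}
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        ecc[s] = max(dist.values())   # eccentricity of s
    # diameter = max eccentricity, radius = min eccentricity
    return max(ecc.values()), min(ecc.values())
```

For example, on a path on four vertices the diameter is 3 (between the two endpoints) and the radius is 2 (from either middle vertex).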